Sex Workers, Performers Invited to Participate in Platform Discrimination Study

NEWCASTLE, U.K. — Digital studies and sexuality researcher Dr. Carolina Are is asking sex workers, adult performers and others who have experienced discrimination to participate in a study investigating Instagram and TikTok’s approach to malicious flagging or reporting of "gray area" content, including nudity.

Are is seeking participants over 18 years of age who have received negative comments and simultaneously had their accounts and/or content removed.

Are told XBIZ that, in the absence of communication or transparency about content governance from social media platforms, her two-year study “aims to infer Instagram and TikTok’s approach to moderating ‘grey area’ content by focusing on user experience, in the hope to gain information about platforms’ processes to make the case for fairer moderation.”

Are described “grey area” as referring to “content that social media community guidelines and moderators have so far struggled to moderate,” such as “journalism, education, nudity, activism.”

Are plans to circulate an anonymous survey and then interview specific case studies. Those wishing to share their experience with social media discrimination can fill out the survey here.

“It only takes a few minutes, and people can get into as much detail as they feel they need to,” Are explained.

A Corporate Culture of Denial About Discrimination and 'Shadowbans'

Social media platforms, Are said, “often deny that malicious reporting has an influence on accounts and content deletion, but my own personal experience and a variety of users’ experiences seem to prove otherwise. For example, every time my TikTok account was deleted — a whopping four times in 2021, with three times happening during the same week — it was after I received an avalanche of negative comments about my pole dancing posts not being appropriate for a social network ‘for children.’ While I was the one getting misogynistic insults and rape threats, it was my content that was being removed.

“We do know that once accounts are removed, it’s very difficult to speak to a human within platform teams to get them reinstated, leaving users out of a network and a tool to make a living for months, sometimes years on end.”

While Are does not think social media platforms are intentionally plotting against sex workers and sex-positive users, she points out that “given these users’ already precarious existence on social networks, and the fact that their content has been disproportionately targeted by platform governance, malicious flagging can become — and, according to some users, is already becoming — a crippling online abuse technique.”

“It’s incredibly important towards users’ online lives and livelihoods that we find out more about it,” she concluded.

Are is an Innovation Fellow at Northumbria University's Centre for Digital Citizens, researching the intersection between online abuse and censorship. Her work on social media moderation, platform governance and algorithm bias has been published in Feminist Media Studies, Porn Studies, First Monday and Journalism, and featured by the BBC, the Atlantic, MIT Technology Review, Business Insider, Vice, Wired and Mashable.

Are is also a blogger and content creator herself, as well as a writer, pole dance instructor and recipient of the Sexual Freedom Awards title, “Activist of 2019.”

Are added that although the focus of her current study is on Instagram and TikTok, if someone has experienced censorship only on one of the two platforms, that’s not a problem.

“I’m looking for an intersectional picture here,” she added, specifically inviting BIPOC, LGBTQIA+, plus-size and other marginalized users to take part.

For more information on Dr. Carolina Are, visit BloggerOnPole.com and follow her on Twitter.

Copyright © 2025 Adnet Media. All Rights Reserved. XBIZ is a trademark of Adnet Media.
Reproduction in whole or in part in any form or medium without express written permission is prohibited.
